Robust variable selection for nonlinear models with diverging number of parameters
Authors
Abstract
Similar Articles
Model Selection for Correlated Data with Diverging Number of Parameters
High-dimensional longitudinal data arise frequently in biomedical and genomic research. It is important to select relevant covariates when the dimension of the parameters diverges as the sample size increases. We propose the penalized quadratic inference function to perform model selection and estimation simultaneously in the framework of a diverging number of regression parameters. The penalize...
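As a rough sketch (generic notation, not taken from this abstract), a penalized quadratic inference function objective typically has the form

$$ Q_n(\beta) + n \sum_{j=1}^{p_n} p_{\lambda_n}\!\left(|\beta_j|\right), $$

where $Q_n(\beta)$ is the quadratic inference function built from the estimating equations and working correlation structure, and $p_{\lambda_n}$ is a sparsity-inducing penalty (e.g., SCAD) with tuning parameter $\lambda_n$.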
Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For those problems, various shrinkage methods (e.g., LASSO, SCAD, etc.) are found particularly useful for the purpose of variable selection (Fan and Peng, 2004; Huang et al., 2007b). Nevertheless, the desirable performances of those shrinkage methods heavily hinge on an appropriate select...
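For orientation only (generic notation, not quoted from the abstract), shrinkage tuning parameters are often chosen by a BIC-type criterion of roughly the form

$$ \mathrm{BIC}_{\lambda} = \log\!\big(\widehat{\sigma}^2_{\lambda}\big) + \mathrm{df}_{\lambda} \cdot \frac{\log n}{n} \cdot C_n, $$

where $\widehat{\sigma}^2_{\lambda}$ is the residual variance of the model selected at $\lambda$, $\mathrm{df}_{\lambda}$ is the number of nonzero coefficients, and $C_n$ is a factor that may diverge slowly with $n$ to accommodate a growing number of parameters.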
General Estimating Equations: Model Selection and Estimation with Diverging Number of Parameters
This paper develops an adaptive elastic net estimator for general estimating equations. We allow the number of parameters to diverge to infinity. The estimator can also handle collinearity among a large number of variables. This method has the oracle property, meaning we can estimate the nonzero parameters with their standard limit and the redundant parameters are dropped from the equations simulta...
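As an illustrative sketch (generic notation, not from the abstract), an adaptive elastic net estimator for estimating equations can be written as the minimizer of

$$ L_n(\beta) + \lambda_2 \sum_{j=1}^{p_n} \beta_j^2 + \lambda_1 \sum_{j=1}^{p_n} \widehat{w}_j\,|\beta_j|, \qquad \widehat{w}_j = |\widetilde{\beta}_j|^{-\gamma}, $$

where $L_n(\beta)$ is a quadratic-form loss built from the estimating equations, $\widetilde{\beta}$ is an initial consistent estimator, and $\gamma > 0$. The adaptive weights drive the oracle property, while the ridge term stabilizes estimation under collinearity.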
The adaptive Gril estimator with a diverging number of parameters
We consider the problem of variable selection and estimation in the linear regression model in situations where the number of parameters diverges with the sample size. We propose the adaptive Generalized Ridge-Lasso (AdaGril), which is an extension of the adaptive Elastic Net. AdaGril incorporates information redundancy among correlated variables for model selection and estimation. It combines ...
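As a hedged sketch (generic notation; the exact formulation is in the paper), a generalized ridge-lasso penalty combines an adaptive $\ell_1$ term with a quadratic term that encodes correlation among predictors:

$$ \lambda_1 \sum_{j=1}^{p_n} \widehat{w}_j\,|\beta_j| + \lambda_2\, \beta^{\top} R\, \beta, $$

where $R$ is a positive semi-definite matrix capturing redundancy among correlated variables and $\widehat{w}_j$ are data-driven weights from an initial estimator.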
Nonconcave Penalized Likelihood with a Diverging Number of Parameters
A class of variable selection procedures for parametric models via nonconcave penalized likelihood was proposed by Fan and Li to simultaneously estimate parameters and select important variables. They demonstrated that this class of procedures has an oracle property when the number of parameters is finite. However, in most model selection problems the number of parameters should be large and gr...
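For reference (standard notation, not reproduced from the abstract), the SCAD penalty central to this line of work is defined through its derivative

$$ p_{\lambda}'(\theta) = \lambda\left\{ I(\theta \le \lambda) + \frac{(a\lambda - \theta)_{+}}{(a-1)\lambda}\, I(\theta > \lambda) \right\}, \qquad \theta > 0,\ a > 2, $$

with $a = 3.7$ a common default; it penalizes like the lasso near zero but levels off for large coefficients, which is what yields the oracle property.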
Journal
Journal title: Statistics & Probability Letters
Year: 2014
ISSN: 0167-7152
DOI: 10.1016/j.spl.2014.04.013